200 research outputs found

    Using audio stimuli in acceptability judgment experiments

    In this paper, we argue that moving away from written stimuli in acceptability judgment experiments is necessary to address the systematic exclusion of particular empirical phenomena, languages/varieties, and speakers in psycholinguistics. We provide user-friendly guidelines for conducting acceptability experiments with audio stimuli on three platforms: Praat, Qualtrics, and PennController for Ibex. In supplementary materials, we include data and an R script from a sample experiment investigating English constituent order using written and audio stimuli. This paper aims not only to increase the range of languages, speakers, and phenomena included in experimental syntax, but also to help researchers who are interested in conducting experiments to overcome the initial learning curve. Video Abstract: https://www.youtube.com/watch?v=GoWYY1O9ugs
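
    The supplementary R script itself is not reproduced here, but the kind of analysis such a design invites (comparing ratings across written and audio stimuli with a mixed-effects model) can be sketched in a few lines. The sketch below is in Python rather than the paper's R; the column names, condition labels, and placeholder data are assumptions, not the authors' materials.

```python
# Minimal sketch: compare acceptability ratings for written vs. audio
# stimuli with a mixed-effects model. The data frame layout is
# hypothetical; in practice it would be loaded from the results file.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_participants, n_items = 24, 16
rows = [
    {"participant": p, "item": i,
     "modality": m, "condition": c,
     "rating": rng.integers(1, 8)}   # placeholder 1-7 Likert rating
    for p in range(n_participants)
    for i in range(n_items)
    for m in ("written", "audio")
    for c in ("canonical", "scrambled")   # hypothetical order conditions
]
ratings = pd.DataFrame(rows)

# Fixed effects of modality and condition (plus their interaction),
# with random intercepts by participant.
model = smf.mixedlm("rating ~ modality * condition",
                    data=ratings, groups=ratings["participant"])
print(model.fit().summary())
```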

    Shared neural correlates for building phrases in signed and spoken language

    Research on the mental representation of human language has convincingly shown that sign languages are structured similarly to spoken languages. However, whether the same neurobiology underlies the online construction of complex linguistic structures in sign and speech remains unknown. To investigate this question with maximally controlled stimuli, we studied the production of minimal two-word phrases in sign and speech. Signers and speakers viewed the same pictures during magnetoencephalography recording and named them with semantically identical expressions. For both signers and speakers, phrase building engaged left anterior temporal and ventromedial cortices with similar timing, despite different linguistic articulators. Thus, the neurobiological similarity of sign and speech goes beyond gross measures such as lateralization: the same fronto-temporal network achieves the planning of structured linguistic expressions.

    Deaf readers benefit from lexical feedback during orthographic processing

    It has been proposed that poor reading abilities in deaf readers might be related to weak connections between the orthographic and lexical-semantic levels of processing. Here we used event-related potentials (ERPs), known for their excellent time resolution, to examine whether lexical feedback modulates early orthographic processing. Twenty congenitally deaf readers made lexical decisions to target words and pseudowords. Each target stimulus could be preceded by a briefly presented matched-case or mismatched-case identity prime (e.g., ALTAR-ALTAR vs. altar-ALTAR). Results showed an early effect of case overlap at the N/P150 for all targets. Critically, this effect disappeared for words, but not for pseudowords, at the N250, an ERP component sensitive to orthographic processing. This dissociation in the effect of case for word and pseudoword targets provides strong evidence of early, automatic lexical-semantic feedback modulating orthographic processing in deaf readers. Interestingly, despite the dissociation found in the ERP data, behavioural responses to words still benefited from the physical overlap between prime and target, particularly in less skilled readers and those with less experience with words. Overall, our results support the idea that skilled deaf readers have a stronger connection between the orthographic and lexical-semantic levels of processing.
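
    A hedged sketch of how such an N250 case-overlap effect might be quantified follows, using MNE-Python. The file name, event codes, and the 200-300 ms window are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: quantify the case-overlap (identity priming) effect in an
# assumed N250 window (200-300 ms) for word vs. pseudoword targets.
# File name and event codes are hypothetical, not the study's.
import mne

raw = mne.io.read_raw_fif("deaf_reader_eeg.fif", preload=True)
events = mne.find_events(raw)
event_id = {"word/match": 1, "word/mismatch": 2,
            "pseudo/match": 3, "pseudo/mismatch": 4}
epochs = mne.Epochs(raw, events, event_id, tmin=-0.1, tmax=0.6,
                    baseline=(None, 0), preload=True)

def n250_mean(evoked):
    # Mean amplitude across all channels in the assumed 200-300 ms window.
    return evoked.copy().crop(tmin=0.2, tmax=0.3).data.mean()

# The dissociation of interest: a match/mismatch difference that is
# absent for words but present for pseudowords.
for target in ("word", "pseudo"):
    diff = (n250_mean(epochs[f"{target}/mismatch"].average())
            - n250_mean(epochs[f"{target}/match"].average()))
    print(target, "mismatch-minus-match N250:", diff)
```

    In a full analysis, the statistics would operate on per-subject mean amplitudes rather than on the grand averages computed here.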

    Language experience impacts brain activation for spoken and signed language in infancy: Insights from unimodal and bimodal bilinguals

    Recent neuroimaging studies suggest that monolingual infants activate a left-lateralised fronto-temporal brain network in response to spoken language, which is similar to the network involved in processing spoken and signed language in adulthood. However, it is unclear how brain activation to language is influenced by early experience in infancy. To address this question, we present functional near-infrared spectroscopy (fNIRS) data from 60 hearing infants (4-8 months): 19 monolingual infants exposed to English, 20 unimodal bilingual infants exposed to two spoken languages, and 21 bimodal bilingual infants exposed to English and British Sign Language (BSL). Across all infants, spoken language elicited activation in a bilateral brain network including the inferior frontal and posterior temporal areas, while sign language elicited activation in the right temporo-parietal area. A significant difference in brain lateralisation was observed between groups. Activation in the posterior temporal region was not lateralised in monolinguals and bimodal bilinguals, but right-lateralised in response to both language modalities in unimodal bilinguals. This suggests that experience of two spoken languages influences brain activation for sign language when it is experienced for the first time. Multivariate pattern analyses (MVPA) could classify distributed patterns of activation within the left hemisphere for spoken and signed language in monolinguals (proportion correct = 0.68; p = 0.039) but not in unimodal or bimodal bilinguals. These results suggest that bilingual experience in infancy influences brain activation for language, and that unimodal bilingual experience has a greater impact on early brain lateralisation than bimodal bilingual experience.
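
    The MVPA finding follows a standard decoding recipe: train a classifier on distributed activation patterns and score it with cross-validation against chance. A minimal sketch with scikit-learn, assuming a hypothetical per-trial channel matrix in place of the authors' fNIRS data:

```python
# Sketch: decode language modality (spoken vs. signed) from per-trial
# activation patterns with a linear classifier and leave-one-out CV.
# X is a placeholder (n_trials, n_channels) matrix standing in for
# per-trial fNIRS channel responses; labels: 0 = spoken, 1 = signed.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 20))     # placeholder data
y = np.repeat([0, 1], 20)             # 20 spoken + 20 signed trials

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("proportion correct:", scores.mean())   # ~0.5 on random data
```

    In practice, the reported p-value would typically come from a permutation test against the empirical chance distribution rather than from the cross-validated score alone.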

    Changes in Early Cortical Visual Processing Predict Enhanced Reactivity in Deaf Individuals

    Individuals with profound deafness rely critically on vision to interact with their environment. Improvement of visual performance as a consequence of auditory deprivation is assumed to result from cross-modal changes occurring in late stages of visual processing. Here we measured reaction times and event-related potentials (ERPs) in profoundly deaf adults and hearing controls during a speeded visual detection task, to assess to what extent the enhanced reactivity of deaf individuals could reflect plastic changes in the early cortical processing of the stimulus. We found that deaf subjects were faster than hearing controls at detecting the visual targets, regardless of their location in the visual field (peripheral or peri-foveal). This behavioural facilitation was associated with ERP changes starting from the first detectable response in the striate cortex (C1 component) at about 80 ms after stimulus onset, and in the P1 complex (100-150 ms). In addition, we found that P1 peak amplitudes predicted the response times in deaf subjects, whereas in hearing individuals visual reactivity and ERP amplitudes correlated only at later stages of processing. These findings show that long-term auditory deprivation can profoundly alter visual processing from the earliest cortical stages. Furthermore, our results provide the first evidence of a co-variation between modified brain activity (cortical plasticity) and behavioural enhancement in this sensory-deprived population.
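
    The reported brain-behaviour covariation is, at its core, a per-subject correlation between P1 amplitude and reaction time, computed separately per group. A minimal sketch with SciPy, using placeholder values in place of the study's measurements:

```python
# Sketch: per-group correlation between P1 peak amplitude and median
# detection RT. All values are placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
groups = {
    # (P1 peak amplitude in microvolts, median RT in ms) per subject
    "deaf":    (rng.normal(4.0, 1.0, 15), rng.normal(350, 30, 15)),
    "hearing": (rng.normal(4.0, 1.0, 15), rng.normal(380, 30, 15)),
}
for name, (p1_amp, median_rt) in groups.items():
    r, p = pearsonr(p1_amp, median_rt)
    print(f"{name}: r = {r:+.2f}, p = {p:.3f}")
```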

    Does congenital deafness affect the structural and functional architecture of primary visual cortex?

    Deafness results in greater reliance on the remaining senses. It is unknown whether the cortical architecture of the intact senses is optimized to compensate for lost input. Here we performed widefield population receptive field (pRF) mapping of primary visual cortex (V1) with functional magnetic resonance imaging (fMRI) in hearing and congenitally deaf participants, all of whom had learnt sign language after the age of 10 years. We found larger pRFs encoding the peripheral visual field in deaf compared to hearing participants. This was likely driven by larger facilitatory center zones of the pRF profile, concentrated in the near and far periphery in the deaf group. pRF density was comparable between groups; since the deaf group's pRFs were larger but no sparser, they overlapped more. This could suggest that a coarse coding strategy underlies enhanced peripheral visual skills in deaf people. Cortical thickness was also decreased in V1 in the deaf group. These findings suggest that deafness causes structural and functional plasticity at the earliest stages of visual cortex.
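
    The facilitatory centre zones mentioned above come from modelling each pRF profile with a centre-surround shape. A difference-of-Gaussians sketch of such a profile follows, with purely illustrative parameters; the study's actual model and fitted values may differ.

```python
# Sketch: a centre-surround pRF profile as a difference of Gaussians.
# A larger facilitatory centre (sigma_c) yields the broader peripheral
# pRFs described for the deaf group; all parameters are illustrative.
import numpy as np

def dog_prf(x, y, x0, y0, sigma_c, sigma_s, amp_c=1.0, amp_s=0.4):
    """Difference-of-Gaussians pRF response at visual-field point (x, y)."""
    d2 = (x - x0) ** 2 + (y - y0) ** 2
    center = amp_c * np.exp(-d2 / (2 * sigma_c ** 2))
    surround = amp_s * np.exp(-d2 / (2 * sigma_s ** 2))
    return center - surround

# Evaluate a peripheral pRF (centre at 20 deg eccentricity) on a grid
# spanning +/-30 deg of the visual field.
xs, ys = np.meshgrid(np.linspace(-30, 30, 121), np.linspace(-30, 30, 121))
profile = dog_prf(xs, ys, x0=20.0, y0=0.0, sigma_c=5.0, sigma_s=10.0)
print("peak response:", profile.max())
```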

    Haptic spatial configuration learning in deaf and hearing individuals

    The present study investigated haptic spatial configuration learning in deaf individuals, hearing sign language interpreters, and hearing controls. In three trials, participants had to match ten shapes haptically to the cut-outs in a board as fast as possible. Deaf and hearing sign language users outperformed the hearing controls. A similar difference was observed for a rotated version of the board. The groups did not differ, however, on a free relocation trial. Though sign language experience conferred a significant advantage, comparison with results from a previous study testing the same task in a group of blind individuals showed this advantage to be smaller than the one observed for the blind group. These results are discussed in terms of how sign language experience and sensory deprivation benefit haptic spatial configuration processing.

    Modulation of Brain Activity during Action Observation: Influence of Perspective, Transitivity and Meaningfulness

    The coupling process between observed and performed actions is thought to be carried out by a fronto-parietal perception-action system including regions of the inferior frontal gyrus and the inferior parietal lobule. When investigating how movement characteristics influence this process, most research on action observation has focused on only one variable at a time, even though the movements we observe can vary on several levels. By manipulating the visual perspective, transitivity, and meaningfulness of observed movements in a functional magnetic resonance imaging study, we aimed to investigate how the type of movement and the visual perspective modulate brain activity during action observation in healthy individuals. Importantly, we used an active observation task in which participants had to subsequently execute or imagine the observed movements. Our results show that the fronto-parietal regions of the perception-action system were recruited mostly during the observation of meaningless actions, while visual perspective had little influence on activity within this system. Simultaneously investigating several sources of modulation during active action observation is an approach that could lead to a more ecologically valid understanding of this important sensorimotor process.